63 research outputs found

    Using genetic algorithms to create meaningful poetic text

    Work carried out when all authors were at the University of Edinburgh. Peer reviewed. Postprint.

    LinkedLab: A Data Management Platform for Research Communities Using a Linked Data Approach

    Data management plays a key role in how we access, organize, and integrate data. A research community is one domain in which data is disseminated, e.g., across projects, publications, and members. There is no well-established standard for doing so, and the value of the data therefore decreases, e.g., in terms of accessibility, discoverability, and reusability. LinkedLab proposes a platform for managing research-community data using Linked Data techniques. The use of Linked Data affords a more effective way to access, organize, and integrate the data.
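The core Linked Data idea the abstract relies on can be illustrated with a minimal sketch: community records expressed as subject–predicate–object triples and integrated via pattern matching. The URIs and property names below are illustrative stand-ins, not LinkedLab's actual vocabulary.

```python
# Hypothetical sketch: research-community records as subject-predicate-object
# triples, the core data model behind Linked Data. All names are illustrative.
triples = [
    ("ex:alice",    "rdf:type",    "ex:Member"),
    ("ex:alice",    "ex:memberOf", "ex:linkedlab"),
    ("ex:paper1",   "rdf:type",    "ex:Publication"),
    ("ex:paper1",   "ex:author",   "ex:alice"),
    ("ex:project1", "ex:produces", "ex:paper1"),
]

def query(store, s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return [(ts, tp, to) for ts, tp, to in store
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

# Integrating across record types: which publications did alice author?
papers = [s for s, _, _ in query(triples, p="ex:author", o="ex:alice")]
print(papers)  # ['ex:paper1']
```

Because every record shares the same triple shape, projects, publications, and members can be queried and joined uniformly, which is the accessibility and integration benefit the abstract claims.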

    Gated Recurrent Neural Tensor Network

    Recurrent Neural Networks (RNNs) are a powerful scheme for modeling temporal and sequential data, but they need to capture long-term dependencies in datasets and represent them in hidden layers with a model powerful enough to capture more information from the inputs. For modeling long-term dependencies in a dataset, the gating mechanism concept can help RNNs remember and forget previous information. Representing the hidden layers of an RNN with more expressive operations (i.e., tensor products) helps it learn a more complex relationship between the current input and the previous hidden layer information. These ideas can generally improve RNN performance. In this paper, we propose a novel RNN architecture that combines the gating mechanism and the tensor product into a single model. By combining these two concepts into a single RNN, our proposed models learn long-term dependencies through gating units and obtain a more expressive and direct interaction between the input and hidden layers through a tensor product over 3-dimensional array (tensor) weight parameters. We take the Long Short-Term Memory (LSTM) RNN and the Gated Recurrent Unit (GRU) RNN and incorporate a tensor product into their formulations. The resulting RNNs, called the Long Short-Term Memory Recurrent Neural Tensor Network (LSTMRNTN) and the Gated Recurrent Unit Recurrent Neural Tensor Network (GRURNTN), combine the LSTM and GRU models with the tensor product. We conducted experiments on word-level and character-level language modeling tasks and found that our proposed models significantly improve performance over our baseline models. Comment: Accepted at IJCNN 2016. URL: http://ieeexplore.ieee.org/document/7727233
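The combination the abstract describes can be sketched as follows: a GRU-style cell whose candidate state adds a bilinear tensor term x^T T[k] h between the input and the (reset-gated) previous hidden state. This is a minimal NumPy sketch under assumed dimensions and random weights, not the paper's exact formulation or initialization.

```python
import numpy as np

# Minimal sketch (not the paper's exact GRURNTN equations): a GRU-style cell
# whose candidate state adds a bilinear tensor product between the input x
# and the reset-gated previous hidden state. Sizes are illustrative.
rng = np.random.default_rng(0)
n_in, n_hid = 4, 3

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Standard GRU gate weights, plus a 3-D tensor for the candidate state.
Wz, Uz = rng.normal(size=(n_hid, n_in)), rng.normal(size=(n_hid, n_hid))
Wr, Ur = rng.normal(size=(n_hid, n_in)), rng.normal(size=(n_hid, n_hid))
Wh, Uh = rng.normal(size=(n_hid, n_in)), rng.normal(size=(n_hid, n_hid))
T = rng.normal(size=(n_hid, n_in, n_hid))  # one bilinear form per hidden unit

def gru_tensor_step(x, h):
    z = sigmoid(Wz @ x + Uz @ h)                     # update gate
    r = sigmoid(Wr @ x + Ur @ h)                     # reset gate
    # Direct input-hidden interaction: x^T T[k] (r*h) for each hidden unit k.
    bilinear = np.einsum("i,kij,j->k", x, T, r * h)
    h_cand = np.tanh(Wh @ x + Uh @ (r * h) + bilinear)
    return (1 - z) * h + z * h_cand                  # gated state update

h = np.zeros(n_hid)
for x in rng.normal(size=(5, n_in)):                 # run over a toy sequence
    h = gru_tensor_step(x, h)
print(h.shape)  # (3,)
```

The `einsum` line is the tensor-product part: without it, the cell reduces to a plain GRU; with it, each hidden unit gets its own bilinear interaction between input and hidden state.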

    An Implementation of a Flexible Author-Reviewer Model of Generation using Genetic Algorithms

    PACLIC / The University of the Philippines Visayas Cebu College Cebu City, Philippines / November 20-22, 200

    Extending an Indonesian Semantic Analysis-based Question Answering System with Linguistic and World Knowledge Axioms

    PACLIC / The University of the Philippines Visayas Cebu College Cebu City, Philippines / November 20-22, 200

    Developing an Online Indonesian Corpora Repository


    A chart generation system for topical metrical poetry

    Several poetry generation systems that are in some way inspired or motivated by existing articles, such as newspaper stories, have recently appeared. However, most if not all of them employ template-based generation, which limits both the expressiveness of the system and its ability to faithfully convey the message of the source article. In this paper we present our work on a poetry generation system that uses a dependency parser to extract the predicate-argument structure of the input article and tries to maintain this structure through deep syntactic text generation while complying with a given target form. The combinatorial nature of this task presents huge challenges, and we describe several improvements that have been applied in an attempt to produce poetry in a tractable fashion.
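One ingredient such a system needs is a check that generated text complies with the target form. A toy sketch of that step: estimate syllable counts per line and compare them against a metrical template. The vowel-group heuristic here is a crude stand-in for the phonetic lexicon a real system would use.

```python
import re

# Toy form-compliance check: count syllables per line (via a crude
# vowel-group heuristic) and compare against a target template.
def syllables(word):
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))  # every word has at least one syllable

def fits_form(lines, template):
    """template: required syllable count per line, e.g. [5, 7, 5] for a haiku."""
    counts = [sum(syllables(w) for w in line.split()) for line in lines]
    return counts == template

poem = ["an old silent pond",
        "a frog jumps into the pond",
        "the sound of water"]
print(fits_form(poem, [5, 7, 5]))  # True
```

In a chart-based generator, a check like this would be applied to partial lines during search rather than to a finished poem, pruning candidates that can no longer satisfy the form.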

    Texture fusion for batik motif retrieval system

    This paper systematically investigates the effect of image texture features on batik motif retrieval performance. The retrieval process uses a query motif image to find matching motif images in a database. In this study, fusion of various image texture features, such as Gabor, Log-Gabor, Grey Level Co-occurrence Matrix (GLCM), and Local Binary Pattern (LBP) features, is attempted for motif image retrieval. For performance evaluation, both individual features and fused feature sets are applied. Experimental results show that optimal feature fusion outperforms individual features in batik motif retrieval. Among the individual features tested, Log-Gabor features provide the best result. The proposed approach is best suited to a scenario in which a query image containing multiple basic motif objects is applied to a dataset whose images also contain multiple motif objects. The retrieval rate reaches 84.54% rank-3 precision when the feature space fuses Gabor, GLCM, and Log-Gabor features. The investigation also shows that the proposed method does not work well when a query image containing multiple basic motif objects is applied to a dataset in which the retrieved images contain only one basic motif object.
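The fusion-and-retrieval pipeline the abstract evaluates can be sketched generically: normalize each feature type independently, concatenate the vectors, and rank database images by distance to the query. The feature values below are random stand-ins; the actual Gabor, GLCM, and Log-Gabor extractors (and the paper's normalization choices) are assumed, not reproduced.

```python
import numpy as np

# Illustrative feature-fusion sketch for retrieval: per-feature-type
# z-normalization, concatenation, then nearest-neighbour ranking.
# Feature values are random stand-ins for real texture descriptors.
rng = np.random.default_rng(1)
n_images = 10
gabor  = rng.normal(size=(n_images, 8))   # stand-in Gabor features
glcm   = rng.normal(size=(n_images, 4))   # stand-in GLCM features
loggab = rng.normal(size=(n_images, 8))   # stand-in Log-Gabor features

def fuse(*feature_sets):
    """Z-normalize each feature set independently, then concatenate.

    Per-set normalization keeps one feature type from dominating the
    fused distance just because its raw values have a larger scale."""
    normed = [(f - f.mean(axis=0)) / (f.std(axis=0) + 1e-8)
              for f in feature_sets]
    return np.hstack(normed)

db = fuse(gabor, glcm, loggab)            # fused database descriptors
query = db[0]                             # pretend image 0 is the query motif
dists = np.linalg.norm(db - query, axis=1)
ranking = np.argsort(dists)               # rank-ordered retrieval results
print(ranking[0])  # 0 -- the query is its own nearest neighbour
```

Rank-k precision, as reported in the abstract, would then be computed over the top k entries of `ranking` against ground-truth motif labels.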